Single image super-resolution algorithm based on unified iterative least squares regulation
ZHAO Xiaole, WU Yadong, TIAN Jinsha, ZHANG Hongying
Journal of Computer Applications    2016, 36 (3): 800-805.   DOI: 10.11772/j.issn.1001-9081.2016.03.800
Machine-learning-based image Super-Resolution (SR) has proved to be a promising single-image SR technology, within which sparse representation and dictionary learning have become a research hotspot. To address time-consuming dictionary training and low-accuracy SR recovery, an SR algorithm was proposed from the perspective of reducing, as far as possible, the inconsistency between the Low-Resolution (LR) and High-Resolution (HR) feature spaces. The Iterative Least Squares Dictionary Learning Algorithm (ILS-DLA) was adopted to train the LR/HR dictionaries, and Anchored Neighborhood Regression (ANR) was used to recover the HR images. Thanks to its integral optimization procedure, ILS-DLA trains the LR/HR dictionaries in relatively little time; and because it adopts the same optimization strategy as ANR, it effectively reduces the divergence between the LR and HR dictionaries in theory. Extensive experiments show that the proposed method achieves superior dictionary learning to the K-means Singular Value Decomposition (K-SVD) and Beta Process Joint Dictionary Learning (BPJDL) algorithms, among others, and provides better image restoration results than other state-of-the-art SR algorithms.
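The ANR recovery step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes trained LR/HR dictionaries are given as column-normalized matrices, and the function names, neighborhood size `K` and ridge parameter `lam` are hypothetical (the ILS-DLA training itself is omitted).

```python
import numpy as np

def train_anr_projections(D_l, D_h, K=5, lam=0.1):
    # For each anchor atom, gather its K most correlated LR atoms and
    # precompute a ridge-regression projection from LR to HR features.
    n_atoms = D_l.shape[1]
    projections = []
    for j in range(n_atoms):
        sims = D_l.T @ D_l[:, j]            # correlation with anchor j
        nbrs = np.argsort(-sims)[:K]        # indices of K nearest atoms
        Nl, Nh = D_l[:, nbrs], D_h[:, nbrs]
        P = Nh @ np.linalg.inv(Nl.T @ Nl + lam * np.eye(K)) @ Nl.T
        projections.append(P)
    return projections

def anr_reconstruct(y, D_l, projections):
    # Anchor = LR atom most correlated with the input LR feature y;
    # recovery is then a single precomputed matrix-vector product.
    j = int(np.argmax(D_l.T @ y))
    return projections[j] @ y
```

At test time only a nearest-anchor lookup and one matrix-vector product are needed per patch, which is what makes anchored regression fast compared with solving a sparse-coding problem per patch.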
No-reference image quality assessment based on scale invariance
TIAN Jinsha, HAN Yongguo, WU Yadong, ZHAO Xiaole, ZHANG Hongying
Journal of Computer Applications    2016, 36 (3): 789-794.   DOI: 10.11772/j.issn.1001-9081.2016.03.789
Most existing general-purpose no-reference image quality assessment methods use machine learning to train regression models on images with associated human subjective scores, and then predict the perceptual quality of a test image. However, such opinion-aware methods require lengthy training and depend on the distortion types present in the training database, so they generalize poorly, which limits their usability in practice. To remove this database dependence, a no-reference image quality assessment method based on normalized scale invariance was proposed. In the proposed method, Natural Scene Statistics (NSS) features and edge characteristics were combined as the effective features for image quality assessment, and no information beyond the test image itself was required; the two feature vectors were then used to compute the global difference across scales as the image quality score. The experimental results show that the proposed method evaluates multiply-distorted images well at low computational complexity. Compared with state-of-the-art no-reference image quality assessment models, the proposed method offers better overall performance and is suitable for practical applications.
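The scoring idea, comparing NSS and edge features of an image against the same features extracted at a coarser scale, can be sketched as below. This is an illustrative approximation rather than the paper's exact feature set: the box filter, the particular MSCN statistics, and the 2x downsampling rule are all assumptions.

```python
import numpy as np

def _box_mean(img, w=3):
    # Naive local mean with reflected borders (small w keeps it cheap).
    p = np.pad(img, w // 2, mode="reflect")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(w):
        for dx in range(w):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (w * w)

def features(img):
    # Mean-Subtracted Contrast-Normalized (MSCN) coefficients are a
    # standard NSS feature; edge strength comes from the gradient.
    mu = _box_mean(img)
    sigma = np.sqrt(np.maximum(_box_mean(img ** 2) - mu ** 2, 0.0))
    mscn = (img - mu) / (sigma + 1.0)
    gx, gy = np.gradient(img)
    edge = np.hypot(gx, gy)
    return np.array([mscn.std(), np.abs(mscn).mean(), edge.mean()])

def quality_score(img):
    # Quality = distance between the feature vector at full scale and
    # at half scale; a pristine natural image is nearly scale-invariant,
    # so larger differences suggest stronger distortion.
    h, w = img.shape
    small = img[:h - h % 2, :w - w % 2]
    small = small.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return float(np.linalg.norm(features(img) - features(small)))
```

No training images or subjective scores are involved; the score is computed from the test image alone, which is the "opinion-unaware" property motivating the method.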
Polynomial interpolation algorithm framework based on osculating polynomial approximation
ZHAO Xiaole, WU Yadong, ZHANG Hongying, ZHAO Jing
Journal of Computer Applications    2015, 35 (8): 2266-2273.   DOI: 10.11772/j.issn.1001-9081.2015.08.2266

Polynomial interpolation is a common approximation technique in approximation theory and is widely used in numerical analysis, signal processing, and other fields. Traditional polynomial interpolation algorithms have mainly been developed by combining numerical analysis with experimental results, lacking a unified theoretical description and a regular solution procedure. A unified theoretical framework for polynomial interpolation algorithms based on osculating polynomial approximation theory was therefore proposed. Under this framework, which is parameterized by the number of sample points, the osculating order at the sample points, and the derivative approximation rules, existing interpolation algorithms can be analyzed and new algorithms can be developed. The representation of existing mainstream interpolation algorithms within the proposed framework was analyzed, and the general procedure for developing new algorithms was illustrated with a four-point, second-order osculating polynomial interpolation. Theoretical analysis and numerical experiments show that almost all mainstream polynomial interpolation algorithms are instances of osculating polynomial interpolation, and that their effectiveness is strongly related to the number of sample points, the osculating order, and the derivative approximation rules.
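A four-point osculating interpolant of this kind can be sketched as a cubic Hermite interpolant on the central interval, with the derivatives at the two inner nodes supplied by a derivative approximation rule. The sketch below uses central differences as that rule, which is one possible instantiation of the framework's three parameters, not necessarily the paper's choice.

```python
import numpy as np

def cubic_hermite(x0, x1, f0, f1, d0, d1, x):
    # Osculating (Hermite) interpolant matching function values and
    # first derivatives at the two endpoints of [x0, x1].
    h = x1 - x0
    t = (x - x0) / h
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * f0 + h10 * h * d0 + h01 * f1 + h11 * h * d1

def four_point_interp(xs, ys, x):
    # Framework parameters made explicit: 4 sample points, osculating
    # order 1 at the two inner nodes, central differences as the
    # derivative approximation rule. Interpolates on [xs[1], xs[2]].
    d1 = (ys[2] - ys[0]) / (xs[2] - xs[0])
    d2 = (ys[3] - ys[1]) / (xs[3] - xs[1])
    return cubic_hermite(xs[1], xs[2], ys[1], ys[2], d1, d2, x)
```

Swapping the derivative rule or the number of neighbors reproduces other classical kernels (this particular choice coincides with Catmull-Rom interpolation), which illustrates how existing algorithms fall out of the framework as parameter settings.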

Sparse tracking algorithm based on multi-feature fusion
HU Shaohua, XU Yuwei, ZHAO Xiaolei, HE Jun
Journal of Computer Applications    2014, 34 (8): 2380-2384.   DOI: 10.11772/j.issn.1001-9081.2014.08.2380

A novel sparse tracking method based on multi-feature fusion was proposed to compensate for the incomplete description provided by any single feature. Firstly, to fuse the various features, multiple feature descriptors of the dictionary templates and the candidate particles were encoded as kernel matrices. Secondly, every candidate particle was sparsely represented as a linear combination of all dictionary atoms. The sparse representation model was then solved efficiently with a Kernelizable Accelerated Proximal Gradient (KAPG) method. Lastly, within the particle filter framework, the particle weights were determined by the sparse-coefficient reconstruction errors to realize tracking. During tracking, a template update strategy based on incremental subspace learning was introduced. The experimental results show that, compared with related state-of-the-art methods, the proposed algorithm improves tracking accuracy under challenging conditions such as occlusion, illumination change, pose change, background clutter and viewpoint variation.
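The kernelized sparse-coding step can be sketched with a plain proximal-gradient (ISTA) loop written entirely in terms of kernel matrices, so only inner products of features are needed. The paper's accelerated (KAPG) variant adds a momentum term omitted here for brevity; the parameter values are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def kernel_sparse_code(K_dd, K_dy, lam=0.1, n_iter=200):
    # ISTA in kernel form: minimize over coefficients c
    #   0.5 * c^T K_dd c - c^T K_dy + lam * ||c||_1,
    # which is the sparse representation of a candidate y over the
    # dictionary D expressed purely through kernels K_dd and K_dy.
    L = np.linalg.eigvalsh(K_dd).max()   # Lipschitz constant of gradient
    c = np.zeros(K_dd.shape[0])
    for _ in range(n_iter):
        grad = K_dd @ c - K_dy
        c = soft_threshold(c - grad / L, lam / L)
    return c

def recon_error(c, K_dd, K_dy, k_yy):
    # Reconstruction error in feature space, used as the particle weight:
    # ||phi(y) - phi(D) c||^2 expanded via kernels.
    return k_yy - 2.0 * c @ K_dy + c @ K_dd @ c
```

In the particle filter, each candidate's weight decreases with `recon_error`, so the particle best explained by the template dictionary dominates the state estimate.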

Reverse curvature-driven super-resolution algorithm based on Taylor formula
ZHAO Xiaole, WU Yadong, ZHANG Hongying, ZHAO Jing
Journal of Computer Applications    2014, 34 (12): 3570-3575.  

To address the loss of contrast and sharpness typical of traditional interpolation and model-based methods, a reverse curvature-driven Super-Resolution (SR) algorithm based on the Taylor formula was proposed. The algorithm used the Taylor formula to estimate the trend of image intensity, and then refined the image edge features with the curvature of the isophotes. Gradients were used as constraints to suppress jagged edges and ringing artifacts. The experimental results show that the proposed algorithm has obvious advantages over conventional interpolation algorithms and model-based methods in sharpness and information preservation, and that its results accord better with human visual perception. Because the reverse diffusion is implemented via the Taylor expansion, the proposed algorithm is also more efficient than traditional iterative algorithms.
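The edge-refinement idea, diffusing with a reversed sign along the isophote curvature while the gradient magnitude acts as a brake near flat regions, can be sketched as below. This is a minimal illustration assuming a grayscale float image; the Taylor-based trend estimation and the full SR pipeline are omitted, and the step size and iteration count are assumptions.

```python
import numpy as np

def isophote_curvature(img, eps=1e-6):
    # Curvature of the isophote lines: div(grad u / |grad u|).
    gx, gy = np.gradient(img)
    mag = np.sqrt(gx**2 + gy**2) + eps
    nxx = np.gradient(gx / mag, axis=1)
    nyy = np.gradient(gy / mag, axis=0)
    return nxx + nyy

def reverse_curvature_sharpen(img, n_iter=5, dt=0.1):
    # Reverse (negative-sign) curvature diffusion: instead of smoothing
    # level lines it sharpens them. Multiplying by the gradient magnitude
    # confines the update to edge regions, limiting ringing in flat areas.
    u = img.astype(float).copy()
    for _ in range(n_iter):
        gx, gy = np.gradient(u)
        mag = np.hypot(gx, gy)
        u -= dt * isophote_curvature(u) * mag
    return u
```

Flat regions have zero gradient magnitude and are left untouched, which is the constraint role the gradients play in the abstract above.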
